Sparse quantum Gaussian processes to counter the curse of dimensionality

Authors

Abstract

Gaussian processes are well-established Bayesian machine learning algorithms with significant merits, despite a strong limitation: lack of scalability. Clever solutions address this issue by inducing sparsity through low-rank approximations, often based on the Nyström method. Here, we propose a different method to achieve better scalability and higher accuracy using quantum computing, significantly outperforming classical neural networks for large datasets. Unlike other approaches to quantum machine learning, the computationally expensive linear algebra operations are not just replaced with their quantum counterparts. Instead, we start from a recent study that proposed a quantum circuit implementing Gaussian processes and then use quantum phase estimation to induce a low-rank approximation analogous to the one in sparse Gaussian processes. We provide evidence through numerical tests, mathematical error bound estimation, and complexity analysis that the algorithm can counter the "curse of dimensionality," where each additional input parameter no longer leads to an exponential growth of the computational cost. This is also demonstrated by applying the algorithm in a practical setting and using it for the data-driven design of a recently proposed metamaterial. The algorithm, however, requires quantum computing hardware improvements before this advantage can be achieved.
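As context for the low-rank sparsification the abstract refers to, below is a minimal classical sketch of Nyström-style sparse GP regression with inducing points. It is not the paper's quantum algorithm; the kernel choice, the subset-of-regressors predictive mean, and all names (rbf_kernel, sparse_gp_predict, the inducing set Z) are illustrative assumptions.

```python
# Minimal sketch (not the paper's quantum algorithm): classical sparse GP
# regression with M inducing points, illustrating the low-rank (Nystrom-style)
# approximation that sparse Gaussian processes rely on.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_predict(X, y, X_star, Z, noise=1e-2):
    """Predictive mean at X_star using inducing points Z (subset-of-regressors).

    Only an M x M system is solved, giving O(N M^2) cost instead of the O(N^3)
    of an exact GP.
    """
    Kmm = rbf_kernel(Z, Z)                 # M x M
    Knm = rbf_kernel(X, Z)                 # N x M
    Ksm = rbf_kernel(X_star, Z)            # N* x M
    A = Kmm + Knm.T @ Knm / noise          # M x M system replaces the N x N one
    return Ksm @ np.linalg.solve(A, Knm.T @ y) / noise

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
    Z = np.linspace(-3, 3, 20)[:, None]    # M = 20 inducing points
    X_star = np.linspace(-3, 3, 5)[:, None]
    print(sparse_gp_predict(X, y, X_star, Z))
```

The M x M linear system in place of an N x N one is what yields the reduced cost quoted in the SPGP entry below; the paper's contribution is to induce an analogous low-rank structure with quantum phase estimation instead.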


Similar resources

Variable Noise and Dimensionality Reduction for Sparse Gaussian processes

The sparse pseudo-input Gaussian process (SPGP) is a new approximation method for speeding up GP regression in the case of a large number of data points N. The approximation is controlled by the gradient optimization of a small set of M 'pseudo-inputs', thereby reducing complexity from O(N³) to O(M²N). One limitation of the SPGP is that this optimization space becomes impractically big for high d...


The curse of dimensionality

In this text, some questions related to higher-dimensional geometrical spaces are discussed. The goal is to give the reader a feeling for geometric distortions related to the use of such spaces (e.g. as search spaces).


Overcoming the Curse of Dimensionality ?

We study the behavior of pivot-based algorithms for similarity searching in metric spaces. We show that they are effective tools for intrinsically high-dimensional spaces, and that their performance is basically dependent on the number of pivots used and the precision used to store the distances. In this paper we give a simple yet effective recipe for practitioners seeking a black-box method ...
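To make the pivot mechanism concrete, here is a minimal sketch (an illustration under assumed names and an assumed Euclidean example metric, not taken from the paper) of pivot-based range search that precomputes point-to-pivot distances and uses the triangle inequality to discard candidates without computing their distance to the query.

```python
# Minimal sketch: pivot-based pruning for range search in a metric space.
import numpy as np

def build_pivot_table(points, pivots, dist):
    """Precompute distances from every point to every pivot (N x P table)."""
    return np.array([[dist(p, v) for v in pivots] for p in points])

def range_search(query, radius, points, pivots, table, dist):
    """Return all points within `radius` of `query`.

    Point i can be discarded if |d(q, pivot_j) - d(point_i, pivot_j)| > radius
    for any pivot j (triangle inequality); only survivors are checked directly.
    """
    q_to_piv = np.array([dist(query, v) for v in pivots])
    results = []
    for i, p in enumerate(points):
        if np.any(np.abs(q_to_piv - table[i]) > radius):
            continue                      # pruned without a distance evaluation
        if dist(query, p) <= radius:      # verified with a real distance
            results.append(p)
    return results

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pts = rng.normal(size=(1000, 8))
    pivots = pts[:16]                     # the pivot count drives pruning power
    euclid = lambda a, b: float(np.linalg.norm(a - b))
    table = build_pivot_table(pts, pivots, euclid)
    print(len(range_search(pts[0], 1.5, pts, pivots, table, euclid)))
```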


Privacy and the Dimensionality Curse

Most privacy-transformation methods such as k-anonymity or randomization use some kind of transformation on the data for privacy-preservation purposes. In many cases, the data can be indirectly identified with the use of a combination of attributes. Such attributes may be available from public records and they may be used to link the sensitive records to the target of interest. Thus, the sensit...


The Rate of Entropy for Gaussian Processes

In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between Rényi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
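For reference, the limit-of-conditional-entropy construction mentioned in the snippet reads as follows in the familiar Shannon case; the snippet's approach applies the same limiting argument with the Tsallis functional in place of H.

```latex
% Entropy rate of a stationary process as a limit of conditional entropies.
h(\mathcal{X}) \;=\; \lim_{n \to \infty} H\!\left(X_n \mid X_{n-1}, \dots, X_1\right)
```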



Journal

Journal title: Quantum Machine Intelligence

Year: 2021

ISSN: 2524-4906, 2524-4914

DOI: https://doi.org/10.1007/s42484-020-00032-8